Scaled Bregman divergences in a Tsallis scenario
Authors
Abstract
There exist two different versions of the Kullback-Leibler divergence (K-Ld) in Tsallis statistics, namely the usual generalized K-Ld and the generalized Bregman K-Ld, and attempts to reconcile them have encountered difficulties. By recourse to the additive duality of Tsallis statistics, a condition for consistency between these two generalized forms of the K-Ld is derived. It is also shown that the usual generalized K-Ld subjected to this additive duality, known as the dual generalized K-Ld, is a scaled Bregman divergence. This leads to an interesting conclusion: the dual generalized mutual information is a scaled Bregman information. The utility and implications of these results are discussed.
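For orientation, the quantities named in the abstract can be sketched with their standard definitions (a sketch using common conventions; the paper's own notation may differ):

```latex
% q-deformed (Tsallis) logarithm; the limit q -> 1 recovers \ln x
\ln_q x = \frac{x^{1-q} - 1}{1 - q}

% usual generalized K-Ld (Tsallis relative entropy)
D_q(p \,\|\, r) = -\sum_x p(x)\, \ln_q \frac{r(x)}{p(x)}

% additive duality and the dual generalized K-Ld
q^* = 2 - q, \qquad D_{q^*}(p \,\|\, r)

% Bregman divergence for a convex generator \phi
B_\phi(x, y) = \phi(x) - \phi(y) - \langle \nabla\phi(y),\, x - y \rangle
```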
Similar resources
Generalized Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework
The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics subjected to the additive duality of generalized statistics (dual generalized K-Ld) is reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence. The Pyth...
Deformed Statistics Kullback-Leibler Divergence Minimization within a Scaled Bregman Framework
The generalized Kullback-Leibler divergence (K-Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K-Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K-Ld is a scaled Bregman divergence....
Re-examination of Bregman functions and new properties of their divergences
The Bregman divergence (Bregman distance, Bregman measure of distance) is a certain useful substitute for a distance, obtained from a well-chosen function (the “Bregman function”). Bregman functions and divergences have been extensively investigated during the last decades and have found applications in optimization, operations research, information theory, nonlinear analysis, machine learning ...
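As a concrete illustration of the definition above, the following sketch computes a Bregman divergence from a generator and its gradient (function names are illustrative, not from the cited paper); the squared-norm generator recovers the squared Euclidean distance, and the negative-entropy generator recovers the Kullback-Leibler divergence on probability vectors:

```python
import math

def bregman(phi, grad_phi, x, y):
    """Bregman divergence B_phi(x, y) = phi(x) - phi(y) - <grad_phi(y), x - y>
    for vectors given as lists of floats."""
    inner = sum(g * (xi - yi) for g, xi, yi in zip(grad_phi(y), x, y))
    return phi(x) - phi(y) - inner

# Generator phi(x) = sum_i x_i^2 yields the squared Euclidean distance.
sq = lambda v: sum(t * t for t in v)
sq_grad = lambda v: [2.0 * t for t in v]

# Generator phi(p) = sum_i p_i log p_i (negative entropy) yields, on
# probability vectors, the Kullback-Leibler divergence.
negent = lambda v: sum(t * math.log(t) for t in v)
negent_grad = lambda v: [math.log(t) + 1.0 for t in v]
```

For example, `bregman(sq, sq_grad, [1.0, 2.0], [3.0, 4.0])` gives 8.0, the squared Euclidean distance between the two points.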
Generalized Statistics Framework for Rate Distortion Theory with Bregman Divergences
A variational principle for the rate distortion (RD) theory with Bregman divergences is formulated within the ambit of the generalized (nonextensive) statistics of Tsallis. The Tsallis-Bregman RD lower bound is established. Alternate minimization schemes for the generalized Bregman RD (GBRD) theory are derived. A computational strategy to implement the GBRD model is presented. The efficacy of th...
A scaled Bregman theorem with applications
Bregman divergences play a central role in the design and analysis of a range of machine learning algorithms through a handful of popular theorems. We present a new theorem which shows that “Bregman distortions” (employing a potentially non-convex generator) may be exactly re-written as a scaled Bregman divergence computed over transformed data. This property can be viewed from the standpoints ...
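A minimal sketch of the scaled Bregman divergence underlying these results (the Stummer-Vajda form, assuming a scalar convex generator applied coordinatewise; names are illustrative): scaling by the second argument itself collapses the expression to the ordinary Kullback-Leibler divergence on probability vectors.

```python
import math

def scaled_bregman(phi, dphi, p, r, m):
    """Scaled Bregman divergence B_phi(P, R | M):
    sum_i m_i [ phi(p_i/m_i) - phi(r_i/m_i) - phi'(r_i/m_i)(p_i/m_i - r_i/m_i) ]."""
    total = 0.0
    for pi, ri, mi in zip(p, r, m):
        a, b = pi / mi, ri / mi
        total += mi * (phi(a) - phi(b) - dphi(b) * (a - b))
    return total

phi = lambda t: t * math.log(t)       # negative-entropy generator
dphi = lambda t: math.log(t) + 1.0    # its derivative

# With the scaling M = R, the terms telescope to sum_i p_i log(p_i / r_i),
# i.e. the ordinary KL divergence, for probability vectors p and r.
p = [0.2, 0.3, 0.5]
r = [0.4, 0.4, 0.2]
kl = sum(pi * math.log(pi / ri) for pi, ri in zip(p, r))
```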
Journal: CoRR
Volume: abs/1101.2190
Pages: -
Published: 2011